High-entropy alloy catalysts: Fundamental aspects, promises towards electrochemical NH3 production, and lessons to learn from deep neural networks

Authors

Abstract

A computational approach to judiciously predict high-entropy alloys (HEAs) as an efficient and sustainable material class for the electrochemical reduction of nitrogen is presented here. The approach employs density functional theory (DFT), the adsorption energies of N atoms and N2 molecules as descriptors of catalytic activity, and deep neural networks. A probabilistic scheme for quantifying HEA catalysts for the nitrogen reduction reaction (NRR) is described, in which the catalyst elements and their concentrations are optimized to increase the probability of specific atomic arrangements on the surfaces. This provides key features for effective filtering of candidates without the need for time-consuming calculations. The relationships between activity and selectivity, which correlate with the averaged valence electron count and electronegativity relative to a reference catalyst, are analyzed in terms of sufficient interaction to sustain the reactions while, at the same time, allowing release of the active site. As a result, a complete list of 3000 HEAs composed of quinary combinations of Mo, Cr, Mn, Fe, Co, Ni, Cu, and Zn is reported, together with metrics that rank them from most to least likely to catalyze the NRR in gas diffusion electrodes, or in the case where non-aqueous electrolytes are utilized to suppress the competing hydrogen evolution reaction. Moreover, the energetic landscape of the transformations is computed and compared with that of Fe. The study also analyses and discusses how the results would translate to liquid-solid aqueous cells, which are further affected by changes in surface properties upon hydroxylation and oxygen, hydrogen, and water coverages.
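The abstract describes tuning the catalyst composition so that favourable atomic arrangements become more probable at the surface. The sketch below is a minimal, hypothetical illustration of that single step, assuming an ideal random solid solution in which surface sites are occupied independently according to the molar fractions; the quinary subset, the three-atom site motif, and the Mo-Mo-Fe target ensemble are illustrative choices, not values from the paper, and the DFT/deep-neural-network adsorption-energy descriptors mentioned in the abstract are not reproduced here.

```python
"""Minimal sketch (not the authors' code) of the composition-optimization idea in
the abstract: assuming an ideal random solid solution, the probability that a
surface site exposes a specific atomic arrangement is multinomial in the molar
fractions, so the composition can be tuned to make a target arrangement more
likely.  All concrete choices below are illustrative assumptions."""

from itertools import product
from math import factorial

ELEMENTS = ("Mo", "Cr", "Mn", "Fe", "Co", "Ni", "Cu", "Zn")  # element pool named in the abstract


def ensemble_probability(fractions: dict, ensemble: tuple) -> float:
    """Probability that a randomly occupied n-atom surface site shows exactly the
    multiset of elements in `ensemble`, under ideal random mixing."""
    counts = {}
    for el in ensemble:
        counts[el] = counts.get(el, 0) + 1
    multiplicity = factorial(len(ensemble))
    prob = 1.0
    for el, k in counts.items():
        multiplicity //= factorial(k)
        prob *= fractions.get(el, 0.0) ** k
    return multiplicity * prob


def best_composition(subset: tuple, ensemble: tuple, step: float = 0.1):
    """Brute-force search over a coarse composition grid of one quinary subset,
    returning the molar fractions that maximise the ensemble probability."""
    n_steps = round(1.0 / step)
    grid = [i * step for i in range(1, n_steps)]         # keep every element present
    best_fracs, best_p = None, -1.0
    for x in product(grid, repeat=len(subset) - 1):
        last = 1.0 - sum(x)
        if last < step / 2:                              # fractions must sum to 1
            continue
        fractions = dict(zip(subset, (*x, last)))
        p = ensemble_probability(fractions, ensemble)
        if p > best_p:
            best_fracs, best_p = fractions, p
    return best_fracs, best_p


if __name__ == "__main__":
    # Hypothetical target: a Mo-Mo-Fe three-fold hollow site on a Mo-Fe-Co-Ni-Cu alloy.
    subset = ("Mo", "Fe", "Co", "Ni", "Cu")
    target_site = ("Mo", "Mo", "Fe")
    fractions, p = best_composition(subset, target_site)
    print("best molar fractions:", {el: round(f, 2) for el, f in fractions.items()})
    print(f"P(target 3-atom site) = {p:.3f}")
```

In the workflow outlined in the abstract, such arrangement probabilities would be combined with the adsorption-energy descriptors to rank candidate compositions; only the composition-to-probability step is sketched here.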


Related articles

Towards Evolutionary Deep Neural Networks

This paper is concerned with the problem of optimizing deep neural networks with diverse transfer functions using evolutionary methods. Standard evolutionary (SEDeeNN) and cooperative coevolutionary methods (CoDeeNN) were applied to three different architectures characterized by different constraints on neural diversity. It was found that (1) SEDeeNN (but not CoDeeNN) changes parameters uniform...

Full text

Deep Forest: Towards An Alternative to Deep Neural Networks

In this paper, we propose gcForest, a decision tree ensemble approach with performance highly competitive to deep neural networks. In contrast to deep neural networks which require great effort in hyperparameter tuning, gcForest is much easier to train. Actually, even when gcForest is applied to different data from different domains, excellent performance can be achieved by almost same settings...

Full text

Learning to Learn Neural Networks

Meta-learning consists in learning learning algorithms. We use a Long Short-Term Memory (LSTM) based network to learn to compute on-line updates of the parameters of another neural network. These parameters are stored in the cell state of the LSTM. Our framework allows us to compare learned algorithms to hand-made algorithms within the traditional train-and-test methodology. In an experiment, we l...

Full text
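The item above outlines a meta-learner in which an LSTM stores another network's parameters in its cell state and computes their on-line updates. The fragment below is a minimal NumPy illustration of that mechanism only, not the authors' implementation: the LSTM weights are left random (the meta-training that would tune them is omitted), and the one-dimensional linear "learnee" fitting y = 2x + 1 is a hypothetical example.

```python
import numpy as np

rng = np.random.default_rng(0)


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


class ParamLSTM:
    """LSTM cell whose cell state c is read out directly as the learnee's parameters."""

    def __init__(self, n_params: int, n_inputs: int):
        m = n_inputs + n_params                 # input = [loss gradient, previous hidden state]
        self.W = rng.normal(0.0, 0.1, size=(4 * n_params, m))   # stacked gates: i, f, o, g
        self.b = np.zeros(4 * n_params)
        self.b[n_params:2 * n_params] = 1.0     # forget-gate bias ~ 1 keeps parameters stable
        self.h = np.zeros(n_params)
        self.c = rng.normal(0.0, 0.1, size=n_params)            # initial learnee parameters

    def step(self, grad: np.ndarray) -> np.ndarray:
        """One on-line update: the gradient goes in, new parameters (= cell state) come out."""
        z = self.W @ np.concatenate([grad, self.h]) + self.b
        n = self.c.size
        i, f, o = sigmoid(z[:n]), sigmoid(z[n:2 * n]), sigmoid(z[2 * n:3 * n])
        g = np.tanh(z[3 * n:])
        self.c = f * self.c + i * g             # updated cell state = updated parameters
        self.h = o * np.tanh(self.c)
        return self.c


# Hypothetical learnee: a 1-D linear model y = w*x + b fitting y = 2x + 1.
x = rng.uniform(-1.0, 1.0, size=64)
y = 2.0 * x + 1.0

lstm = ParamLSTM(n_params=2, n_inputs=2)
for t in range(5):
    w, b = lstm.c
    err = (w * x + b) - y
    grad = np.array([np.mean(err * x), np.mean(err)])   # d(MSE/2)/dw, d(MSE/2)/db
    lstm.step(grad)                                      # LSTM produces the next (w, b)
    print(f"step {t}: loss = {np.mean(err ** 2):.3f}")
```

With untrained LSTM weights the loss will not systematically decrease; the point is only the data flow, in which the optimizer state and the learnee parameters coincide.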

Towards Robust Deep Neural Networks with BANG

Machine learning models, including state-of-the-art deep neural networks, are vulnerable to small perturbations that cause unexpected classification errors. This unexpected lack of robustness raises fundamental questions about their generalization properties and poses a serious concern for practical deployments. As such perturbations can remain imperceptible – commonly called adversarial exampl...

Full text

Optimization for Problem Classes – Neural Networks that Learn to Learn –

The main focus of the optimization of artificial neural networks has been the design of a problem dependent network structure in order to reduce the model complexity and to minimize the model error. Driven by a concrete application we identify in this paper another desirable property of neural networks – the ability of the network to efficiently solve related problems denoted as a class of prob...

Full text


Journal

Journal title: Nano Energy

Year: 2023

ISSN: 2211-3282, 2211-2855

DOI: https://doi.org/10.1016/j.nanoen.2022.108027